Equivalently, RLR applies a constraint to the weights,

$\|\mathbf{w}\|_2^2 \le s$.   (4.39)
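To make the equivalence explicit, the penalized and constrained forms of RLR can be connected through a standard Lagrangian argument. As a sketch, assuming the usual least-squares objective with design matrix $\mathbf{X}$ and target vector $\mathbf{y}$ (this notation is assumed here, since the earlier equations are not repeated):

$\min_{\mathbf{w}} \|\mathbf{y} - \mathbf{X}\mathbf{w}\|_2^2 + \lambda \|\mathbf{w}\|_2^2 \quad \Longleftrightarrow \quad \min_{\mathbf{w}} \|\mathbf{y} - \mathbf{X}\mathbf{w}\|_2^2 \;\; \text{subject to} \;\; \|\mathbf{w}\|_2^2 \le s,$

where each $\lambda \ge 0$ corresponds to some bound $s$, with larger $\lambda$ matching smaller $s$.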
Under this constraint, when the regression coefficient of one variable increases, the regression coefficients of the other variables decrease. For example, suppose there are two variables in an RLR model; thus there are two regression coefficients, i.e., $w_1$ and $w_2$. Because $w_1^2 + w_2^2 \le s$, if $w_1$ increases, $w_2$ decreases. Suppose $w_1$ is increased by $\delta$. Because $w_2^2 = s - w_1^2$ when the constraint is active, the following equation shows the relationship between $w_2^{old}$ and $w_2^{new}$, i.e., how $w_2$ decreases when $w_1$ increases,

$w_2^{new} = \sqrt{s - (w_1 + \delta)^2} < \sqrt{s - w_1^2} = w_2^{old}$.   (4.40)
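As a quick numerical check of (4.40), the following minimal sketch confirms that $w_2$ shrinks when $w_1$ grows by $\delta$ (the values of $s$, $w_1$, and $\delta$ are arbitrary illustrative choices, not taken from the text):

```python
import math

s = 1.0      # bound on w1^2 + w2^2, as in (4.39)
w1 = 0.6     # current first coefficient
delta = 0.2  # increase applied to w1

w2_old = math.sqrt(s - w1 ** 2)            # w2 before the increase: 0.8
w2_new = math.sqrt(s - (w1 + delta) ** 2)  # w2 after the increase: 0.6

assert w2_new < w2_old  # the inequality in (4.40)
print(w2_old, w2_new)
```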
Figure 4.18 shows an example of how the regression coefficients are constrained. When the regression coefficient vector moves from point A to point B, the magnitudes of the two regression coefficients change. Because the sum of the squares of the two regression coefficients is fixed, when $w_1$ increases, $w_2$ decreases accordingly.
Fig. 4.18 The working principle of RLR. When the weight vector moves from A to B, the magnitudes of the two weights change. It is guaranteed that $w_2^B < w_2^A$ when $w_1^B > w_1^A$.
In the RLR model, $\lambda$ is a parameter that trades off between the model accuracy and the model complexity, or the model parsimoniousness. This is implemented in an iterated parameter shrinkage process. When $\lambda = 0$, an RLR model reduces to an ordinary linear regression model.
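A minimal sketch of this trade-off, using the standard closed-form ridge solution $(\mathbf{X}^\top \mathbf{X} + \lambda \mathbf{I})^{-1} \mathbf{X}^\top \mathbf{y}$ rather than the iterated shrinkage process described above (the synthetic data and $\lambda$ values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))  # 50 samples, 2 variables
y = X @ np.array([3.0, -2.0]) + rng.normal(scale=0.1, size=50)

def ridge_weights(X, y, lam):
    """Closed-form RLR solution: (X^T X + lam*I)^(-1) X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

for lam in [0.0, 1.0, 10.0, 100.0]:
    w = ridge_weights(X, y, lam)
    # ||w||^2 shrinks as lambda grows; lambda = 0 recovers ordinary least squares
    print(f"lambda={lam:6.1f}  w={np.round(w, 3)}  ||w||^2={w @ w:.3f}")
```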